I'm implementing a feature in my iPhone/iPad app where, when the iPhone is connected to an external display, the iPhone acts as a controller and the external display shows a non-interactive view. I'm using SwiftUI.
My application is quite similar in concept to the one in Apple's documentation: the iPhone/iPad is always used as a controller while the external display always shows the content (in Apple's case, a game). For this reason, when mirroring to QuickTime or StreamLabs, the actual game needs to be mirrored, not the controller. Here's the example from Apple's documentation so that you can visualize it.
Current Implementation
Here's how I've implemented it in the SceneDelegate, following Apple's documentation.
func scene(_ scene: UIScene, willConnectTo session: UISceneSession, options connectionOptions: UIScene.ConnectionOptions) {
    guard let windowScene = scene as? UIWindowScene else { return }

    if session.role == .windowApplication {
        // The device itself shows the interactive controller UI.
        let window = UIWindow(windowScene: windowScene)
        window.rootViewController = UIHostingController(rootView: ContentView())
        window.makeKeyAndVisible()
    }

    if session.role == .windowExternalDisplayNonInteractive {
        // The external display shows the non-interactive content.
        let window = UIWindow(windowScene: windowScene)
        window.rootViewController = UIHostingController(rootView: ExternalView())
        window.makeKeyAndVisible()
    }
}
Where, of course, ContentView should be displayed on the iPhone and ExternalView on the external display (or in QuickTime). And here's my Info.plist entry, if that's relevant to you.
Outcome and the Problem
OK, so this works fine in most cases, such as screen mirroring from my iPhone to a TV using AirPlay: both views are displayed correctly. The problem I'm having, though, is that in certain cases, like recording a movie in QuickTime with the iPhone as the source, or adding the iPhone as a video capture device in StreamLabs, the iPhone's screen (with the ContentView) is mirrored instead of the ExternalView being shown. The ExternalView needs to be shown when using these apps.
Looking at UIApplication.shared.openSessions, only one session is listed, as shown below:
▿ 1 member
  - <UISceneSession: 0x283594ac0; role: UIWindowSceneSessionRoleApplication; persistentIdentifier: ED82F2B9-17EC-435F-8E20-439CECCA92F6> {
        scene = <UIWindowScene: 0x117d051e0>;
        configuration = <UISceneConfiguration: 0x283595cc0; name: ContentView; role: UIWindowSceneSessionRoleApplication> {
            sceneClass = 0x0;
            delegateClass = SwiftUI.AppSceneDelegate;
            storyboard = 0x0;
        };
    } #0
  - super: NSObject
Is there a way to "create" a new session for screen mirroring? I'm beginning to worry that this is not possible with apps like QuickTime and StreamLabs and not something I can fix on my end.
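In the meantime, one workaround that might be possible is to swap what the phone itself shows while its screen is being captured, since QuickTime and StreamLabs seem to mirror the device screen rather than attach an external scene. UIScreen exposes isCaptured and capturedDidChangeNotification for this. A rough, untested sketch (CaptureObserver is just a name for illustration):

import UIKit
import SwiftUI

// Untested sketch: show ExternalView on the device screen while it is
// being captured (QuickTime movie recording, StreamLabs, etc.), since
// those apps mirror the device screen rather than open an external scene.
final class CaptureObserver {
    private var token: NSObjectProtocol?

    func start(in window: UIWindow) {
        token = NotificationCenter.default.addObserver(
            forName: UIScreen.capturedDidChangeNotification,
            object: nil,
            queue: .main
        ) { _ in
            let captured = window.windowScene?.screen.isCaptured ?? false
            // Swap the device window's root view while recording is active.
            window.rootViewController = UIHostingController(
                rootView: captured ? AnyView(ExternalView()) : AnyView(ContentView())
            )
        }
    }
}

The obvious downside is that the controller UI disappears from the phone while recording, so this would be a stopgap rather than a fix.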
Anyway, if you have any solutions to this issue I would very much appreciate any feedback at all.
I cannot seem to create an AVAudioFile from a URL to be played in an AVAudioEngine. Here is my complete code, following the documentation.
import UIKit
import AVKit
import AVFoundation

class ViewController: UIViewController {

    let audioEngine = AVAudioEngine()
    let audioPlayerNode = AVAudioPlayerNode()

    override func viewDidLoad() {
        super.viewDidLoad()
        streamAudioFromURL(urlString: "https://samplelib.com/lib/preview/mp3/sample-9s.mp3")
    }

    func streamAudioFromURL(urlString: String) {
        guard let url = URL(string: urlString) else {
            print("Invalid URL")
            return
        }

        let audioFile = try! AVAudioFile(forReading: url)

        let audioEngine = AVAudioEngine()
        let playerNode = AVAudioPlayerNode()

        audioEngine.attach(playerNode)
        audioEngine.connect(playerNode,
                            to: audioEngine.outputNode,
                            format: audioFile.processingFormat)
        playerNode.scheduleFile(audioFile,
                                at: nil,
                                completionCallbackType: .dataPlayedBack) { _ in
            /* Handle any work that's necessary after playback. */
        }

        do {
            try audioEngine.start()
            playerNode.play()
        } catch {
            /* Handle the error. */
        }
    }
}
I am getting the following error on let audioFile = try! AVAudioFile(forReading: url)
Thread 1: Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=com.apple.coreaudio.avfaudio Code=2003334207 "(null)" UserInfo={failed call=ExtAudioFileOpenURL((CFURLRef)fileURL, &_extAudioFile)}
I have tried many other .mp3 file URLs as well as .wav and .m4a and none seem to work. The documentation makes this look so easy but I have been trying for hours to no avail. If you have any suggestions, they would be greatly appreciated!
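One idea I haven't ruled out: AVAudioFile(forReading:) may only accept file URLs rather than remote HTTP URLs, in which case downloading the audio to disk first might work. A rough, untested sketch of that idea (downloadAndPlay is a hypothetical helper that would live in the same view controller):

// Untested sketch: download the remote audio to a local file first, on the
// assumption that AVAudioFile(forReading:) only reads files on disk.
func downloadAndPlay(urlString: String) {
    guard let remoteURL = URL(string: urlString) else { return }

    URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else { return }

        // Move the download out of its temporary location before it is cleaned up.
        let localURL = FileManager.default.temporaryDirectory
            .appendingPathComponent(remoteURL.lastPathComponent)
        try? FileManager.default.removeItem(at: localURL)

        do {
            try FileManager.default.moveItem(at: tempURL, to: localURL)
            DispatchQueue.main.async {
                // Hand the local file URL to the existing playback code.
                self.streamAudioFromURL(urlString: localURL.absoluteString)
            }
        } catch {
            print("Could not save download: \(error)")
        }
    }.resume()
}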
I would like to create an MKMapRect centered at a CLLocationCoordinate2D with a given size. You can do this with init(origin:size:), but that takes the origin, not the center, which doesn't work for me.
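One approach might be to convert the center to an MKMapPoint and offset the origin by half the size; an untested sketch (mapRect(center:size:) is just an illustrative helper name):

import MapKit

// Sketch: build an MKMapRect of a given size centered on a coordinate by
// offsetting the origin half a width/height from the center map point.
func mapRect(center: CLLocationCoordinate2D, size: MKMapSize) -> MKMapRect {
    let centerPoint = MKMapPoint(center)
    let origin = MKMapPoint(x: centerPoint.x - size.width / 2,
                            y: centerPoint.y - size.height / 2)
    return MKMapRect(origin: origin, size: size)
}

Is there a cleaner, built-in way? Any suggestions?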
I'm working on a navigation app and I would like to snap the userLocation to the MKRoute's polyline, much like Apple does in Apple Maps. I've added the route's MKPolyline to my MKMapView and it looks great, but the user's location is usually not centered on the road, or is off to the side a bit, even when using kCLLocationAccuracyBestForNavigation as the CLLocationManager's accuracy. I need it to be centered like Apple Maps, as shown below.
How would I go about doing this? Is there a way to snap MKAnnotations to roads (or MKPolylines) and update them with the CLLocationManager's userLocation?
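The closest I can think of is plain geometry: project the raw location onto the nearest segment of the route polyline and place the annotation there. An untested sketch (snap(_:to:) is an illustrative helper), though this isn't true road snapping:

import MapKit

// Sketch: project a coordinate onto the nearest segment of a polyline
// and return the projected point as the "snapped" coordinate.
func snap(_ coordinate: CLLocationCoordinate2D, to polyline: MKPolyline) -> CLLocationCoordinate2D {
    guard polyline.pointCount > 1 else { return coordinate }

    let p = MKMapPoint(coordinate)
    let points = polyline.points()
    var best = p
    var bestDistance = Double.greatestFiniteMagnitude

    for i in 0..<(polyline.pointCount - 1) {
        let a = points[i], b = points[i + 1]
        let dx = b.x - a.x, dy = b.y - a.y
        let lengthSquared = dx * dx + dy * dy
        // Parameter t of the projection of p onto segment ab, clamped to [0, 1].
        let t = lengthSquared == 0 ? 0 :
            max(0, min(1, ((p.x - a.x) * dx + (p.y - a.y) * dy) / lengthSquared))
        let candidate = MKMapPoint(x: a.x + t * dx, y: a.y + t * dy)
        let distance = candidate.distance(to: p)
        if distance < bestDistance {
            bestDistance = distance
            best = candidate
        }
    }
    return best.coordinate
}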
I'm creating an app that uses WKWebView. Whenever a user clicks a link in that web view, I want the link to open in Safari instead of inside the web view. I'm really new to coding, so please make it as easy as possible; I just need the code I need to add. Here's my code below:
import UIKit
import WebKit

class FirstViewController: UIViewController {

    var webView: WKWebView!

    override func loadView() {
        // Use the web view as the controller's root view.
        webView = WKWebView()
        view = webView
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        if let url = URL(string: "http://www.example.com/") {
            webView.load(URLRequest(url: url))
        }
    }
}

I'm using Xcode 8 and Swift 3.
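From what I can tell, the usual way to intercept link taps might be a WKNavigationDelegate. An untested sketch of what I mean (webView.navigationDelegate = self would also need to be set, e.g. in viewDidLoad):

import UIKit
import WebKit

// Untested sketch: send link taps to Safari, keep everything else
// (the initial load, redirects) inside the web view.
extension FirstViewController: WKNavigationDelegate {
    func webView(_ webView: WKWebView,
                 decidePolicyFor navigationAction: WKNavigationAction,
                 decisionHandler: @escaping (WKNavigationActionPolicy) -> Void) {
        if navigationAction.navigationType == .linkActivated,
           let url = navigationAction.request.url {
            UIApplication.shared.open(url, options: [:], completionHandler: nil)
            decisionHandler(.cancel)
        } else {
            decisionHandler(.allow)
        }
    }
}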
Is there a way to make use of the Night Mode feature on iPhone 11 and 12 models in camera applications using AVFoundation? I've looked all over the internet and cannot find any solutions; I feel like I have to be missing something. If not, do we know if this will ever become available (especially with ProRes RAW shipping later this year)?
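The closest public API I've found so far is low-light boost on AVCaptureDevice, which isn't the same thing as the camera app's Night Mode; a small sketch in case it's useful to anyone:

import AVFoundation

// Sketch: opt in to low-light boost where the device supports it.
// This is not Night Mode, just the nearest public AVFoundation knob I know of.
func enableLowLightBoostIfAvailable(on device: AVCaptureDevice) {
    guard device.isLowLightBoostSupported else { return }
    do {
        try device.lockForConfiguration()
        device.automaticallyEnablesLowLightBoostWhenAvailable = true
        device.unlockForConfiguration()
    } catch {
        print("Could not configure device: \(error)")
    }
}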